SVMs, Score-Spaces and Maximum Margin Statistical Models
Abstract
There has been significant interest in developing new forms of acoustic model, in particular models which allow additional dependencies to be represented beyond those allowed within a standard hidden Markov model (HMM). This paper discusses one such class of models, augmented statistical models. Here a locally exponential approximation is made about some point on a base distribution. This allows dependencies within the data to be modelled beyond those represented in the base distribution. Augmented models based on Gaussian mixture models (GMMs) and HMMs are briefly described. These augmented models are then related to generative kernels, one approach used to allow support vector machines (SVMs) to be applied to variable-length data. The training of augmented statistical models within an SVM, generative-kernel, framework is then discussed. This may be viewed as using maximum margin training to estimate statistical models. Augmented Gaussian mixture models are then evaluated by rescoring on a large vocabulary speech recognition task.
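The generative kernels mentioned above map a variable-length observation sequence to a fixed-length score vector derived from a base distribution, so that a standard SVM can be applied. A minimal sketch of the idea, assuming a single diagonal-Gaussian base model and using the Fisher score (the gradient of the sequence log-likelihood with respect to the mean) as the score-space; the names `fisher_score` and `generative_kernel` are illustrative, not taken from the paper:

```python
import numpy as np

def fisher_score(obs, mu, var):
    """Gradient of the sequence log-likelihood of a diagonal-Gaussian
    base model w.r.t. its mean: sum_t (o_t - mu) / var.
    Maps a variable-length sequence to one fixed-length vector."""
    return np.sum((obs - mu) / var, axis=0)

def generative_kernel(seq_a, seq_b, mu, var):
    """Linear kernel in the score-space: K(A, B) = phi(A)^T phi(B).
    Sequences of different lengths are compared in a common space."""
    return fisher_score(seq_a, mu, var) @ fisher_score(seq_b, mu, var)
```

Two sequences of different lengths yield score vectors of the same dimension, and the resulting Gram matrix can be passed to any kernelised SVM trainer as a precomputed kernel.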
Similar Resources
Large Margin Training of Acoustic Models for Speech Recognition
LARGE MARGIN TRAINING OF ACOUSTIC MODELS FOR SPEECH RECOGNITION Fei Sha Advisor: Prof. Lawrence K. Saul Automatic speech recognition (ASR) depends critically on building acoustic models for linguistic units. These acoustic models usually take the form of continuous-density hidden Markov models (CD-HMMs), whose parameters are obtained by maximum likelihood estimation. Recently, however, there ha...
Full Text
Training Augmented Models Using SVMs
There has been significant interest in developing new forms of acoustic model, in particular models which allow additional dependencies to be represented beyond those contained within a standard hidden Markov model (HMM). This paper discusses one such class of models, augmented statistical models. Here, a local exponential approximation is made about some point on a base model. This allows additi...
Full Text
Using SVMs with randomised feature spaces: an extreme learning approach
Extreme learning machines are models which nearly match standard SVMs in accuracy but are much faster to train. However, they optimise a sum of squared errors, whereas SVMs are maximum-margin classifiers. This paper proposes to merge both approaches by defining a new kernel. This kernel is computed by the first layer of an extreme learning machine and used to train an SVM. Experiments ...
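The approach in this snippet can be sketched as follows: the fixed random first layer of an extreme learning machine acts as a feature map, and the inner product of its outputs defines a kernel that any SVM trainer can consume as a precomputed Gram matrix. A minimal sketch under those assumptions; `elm_features` and `elm_kernel` are illustrative names, not the paper's API:

```python
import numpy as np

def elm_features(X, W, b):
    """First layer of an extreme learning machine: a fixed random
    projection followed by a sigmoid nonlinearity. W and b are drawn
    once at random and never trained."""
    return 1.0 / (1.0 + np.exp(-(X @ W + b)))

def elm_kernel(XA, XB, W, b):
    """Kernel induced by the random feature map:
    K[i, j] = phi(XA[i])^T phi(XB[j])."""
    return elm_features(XA, W, b) @ elm_features(XB, W, b).T
```

Because the kernel is an explicit inner product of feature vectors, the resulting Gram matrix is symmetric positive semdefinite by construction, as a kernel for SVM training must be.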
Full Text
Discriminative Learning via Semidefinite Probabilistic Models
Discriminative linear models are a popular tool in machine learning. These can generally be divided into two types: linear classifiers, such as support vector machines (SVMs), which are well studied and provide state-of-the-art results, and probabilistic models such as logistic regression. One shortcoming of SVMs is that their output (known as the "margin") is not calibrated, so that it is diffi...
Full Text
Max-Margin Markov Networks
In typical classification tasks, we seek a function which assigns a label to a single object. Kernel-based approaches, such as support vector machines (SVMs), which maximize the margin of confidence of the classifier, are the method of choice for many such tasks. Their popularity stems both from the ability to use high-dimensional feature spaces, and from their strong theoretical guarantees. Ho...
Full Text